
# Dutch language optimization

**Llama 3 ChocoLlama 8B Instruct** (ChocoLlama)
A Dutch instruction-tuned large language model based on Llama-3-8B, fine-tuned with SFT and DPO on several Dutch instruction datasets.
Large Language Model · Transformers · Other
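The instruct model above is distributed as a Hugging Face checkpoint. The following is a minimal sketch of prompting it through the transformers text-generation pipeline; the repo ID `ChocoLlama/Llama-3-ChocoLlama-8B-instruct` and the chat-style return format are assumptions based on the model name and recent transformers behaviour, so check the actual model card before use.

```python
# Minimal sketch, not an official example: prompt a Dutch instruction-tuned
# model through the transformers text-generation pipeline.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="ChocoLlama/Llama-3-ChocoLlama-8B-instruct",  # assumed repo ID
    device_map="auto",  # requires accelerate; drop this to load on CPU
)

# Chat-style input; recent transformers versions apply the model's chat template.
messages = [
    {"role": "user", "content": "Schrijf een korte, vriendelijke e-mail om een afspraak te verzetten."},
]
output = generator(messages, max_new_tokens=200)

# For chat input, generated_text is the full message list; the last entry is the reply.
print(output[0]["generated_text"][-1]["content"])
```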
**Fietje 2** (BramVanroy, MIT)
A Dutch-optimized version of microsoft/phi-2, trained on 28 billion Dutch tokens to enhance its Dutch text generation capabilities.
Large Language Model · Transformers · Other
**ChocoLlama 2 7B Base** (ChocoLlama)
A Dutch-adapted version of Meta's Llama-2-7b, fine-tuned with LoRA on 32B Dutch tokens and released as a base model.
Large Language Model · Transformers · Other
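The entry above describes a LoRA adaptation of Llama-2-7b rather than a full fine-tune. As a rough illustration only, and not the authors' actual training code, the sketch below shows how such a low-rank adaptation could be set up with Hugging Face peft; the base repo ID, rank, alpha, and target modules are all assumptions.

```python
# Illustrative sketch of a LoRA setup for continued Dutch pretraining of
# Llama-2-7b; hyperparameters and target modules are assumptions, not the
# published ChocoLlama recipe.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "meta-llama/Llama-2-7b-hf"  # gated repo; assumes access has been granted
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

lora_cfg = LoraConfig(
    r=16,            # adapter rank (assumed)
    lora_alpha=32,   # scaling factor (assumed)
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # attention projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only the small adapter matrices are trainable

# Continued pretraining on a Dutch text corpus (e.g. with the transformers
# Trainer) would follow here; the frozen base weights stay untouched.
```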
**Geitje 7B** (Rijgersberg, Apache-2.0)
A large open-source Dutch language model based on Mistral 7B, with significantly improved Dutch comprehension and local knowledge coverage through continued training on 10 billion Dutch text tokens.
Large Language Model · Transformers · Other